A test for independence via Bayesian nonparametric estimation of mutual information

Authors

Abstract

Mutual information is a well-known tool for measuring the mutual dependence between random variables. In this article, a Bayesian nonparametric estimator of mutual information is established by means of the Dirichlet process and the k-nearest neighbour distance. As a result, an easy-to-implement test of independence is introduced through the relative belief ratio. Several theoretical properties of the approach are presented. The procedure is illustrated through various examples and compared with its frequentist counterpart.
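For intuition only, the sketch below implements the classical frequentist k-nearest-neighbour (KSG) estimate of mutual information, which corresponds to the kNN-distance ingredient mentioned in the abstract; the Dirichlet-process posterior and the relative belief ratio test of the paper are not reproduced, and all names are illustrative.

```python
# Hedged sketch: KSG (Kraskov-Stogbauer-Grassberger, algorithm 1) k-nearest-
# neighbour estimate of mutual information for continuous 1-D samples.
# This is NOT the paper's Bayesian nonparametric estimator or its test.
import numpy as np
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG estimate of I(X;Y) in nats for continuous 1-D samples x, y."""
    x = np.asarray(x, float).ravel()
    y = np.asarray(y, float).ravel()
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])        # pairwise distances in x
    dy = np.abs(y[:, None] - y[None, :])        # pairwise distances in y
    dz = np.maximum(dx, dy)                     # max-norm in the joint space
    np.fill_diagonal(dz, np.inf)                # exclude self as a neighbour
    eps = np.sort(dz, axis=1)[:, k - 1]         # distance to k-th joint neighbour
    nx = np.sum(dx < eps[:, None], axis=1) - 1  # marginal neighbour counts
    ny = np.sum(dy < eps[:, None], axis=1) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Illustration: a dependent pair vs. an independent (shuffled) pair.
rng = np.random.default_rng(0)
x = rng.normal(size=300)
y = 0.5 * x + rng.normal(size=300)
print(ksg_mi(x, y), ksg_mi(x, rng.permutation(y)))
```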


Similar articles

Nonparametric independence testing via mutual information

We propose a test of independence of two multivariate random vectors, given a sample from the underlying population. Our approach, which we call MINT, is based on the estimation of mutual information, whose decomposition into joint and marginal entropies facilitates the use of recently-developed efficient entropy estimators derived from nearest neighbour distances. The proposed critical values,...
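As a hedged illustration of the decomposition mentioned above, I(X;Y) = H(X) + H(Y) − H(X,Y), the sketch below estimates each term with the plain Kozachenko–Leonenko nearest-neighbour entropy estimator; the paper itself uses more efficient weighted estimators and data-driven critical values, which are not reproduced here.

```python
# Hedged sketch, not MINT itself: mutual information via the joint/marginal
# entropy decomposition, each entropy estimated by the Kozachenko-Leonenko
# k-nearest-neighbour estimator (assumes continuous data without ties).
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(z, k=3):
    """Kozachenko-Leonenko estimate of differential entropy (in nats)."""
    z = np.asarray(z, float)
    if z.ndim == 1:
        z = z[:, None]
    n, d = z.shape
    rho = cKDTree(z).query(z, k + 1)[0][:, -1]  # distance to k-th neighbour
    log_unit_ball = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(rho))

def mi_by_entropies(x, y, k=3):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from nearest-neighbour entropy estimates."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    x = x[:, None] if x.ndim == 1 else x
    y = y[:, None] if y.ndim == 1 else y
    return kl_entropy(x, k) + kl_entropy(y, k) - kl_entropy(np.hstack([x, y]), k)
```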


Exact Test of Independence Using Mutual Information

Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that data samples for each variable are s...
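For rough intuition only, a generic Monte Carlo permutation test of the plug-in mutual information between two discrete samples might look as follows; the exact test described above instead shuffles symbol sequences while preserving transition counts, which this simplified sketch does not attempt.

```python
# Hedged illustration: plug-in MI between discrete samples plus a simple
# permutation null.  Not the exact, transition-count-preserving test above.
import numpy as np

def plugin_mi(x, y):
    """Plug-in estimate of I(X;Y) in nats for discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    xi = np.searchsorted(np.unique(x), x)       # map symbols to indices
    yi = np.searchsorted(np.unique(y), y)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)               # joint contingency counts
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

def permutation_pvalue(x, y, n_perm=999, seed=0):
    """Monte Carlo p-value for H0: I(X;Y) = 0, obtained by permuting y."""
    rng = np.random.default_rng(seed)
    observed = plugin_mi(x, y)
    null = [plugin_mi(x, rng.permutation(y)) for _ in range(n_perm)]
    return (1 + sum(m >= observed for m in null)) / (n_perm + 1)
```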


Mutual Information as a Bayesian Measure of Independence

The problem of hypothesis testing is examined from both the historical and the Bayesian points of view in the case that sampling is from an underlying joint probability distribution and the hypotheses tested for are those of independence and dependence of the underlying distribution. Exact results for the Bayesian method are provided. Asymptotic Bayesian results and historical method quantities...


A nonparametric independence test using random permutations

We propose a new nonparametric test for the supposition of independence between two continuous random variables X and Y. Given a sample of (X,Y ), the test is based on the size of the longest increasing subsequence of the permutation which maps the ranks of the X observations to the ranks of the Y observations. We identify the independence assumption between the two continuous variables with th...
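A minimal sketch of the statistic described above follows, assuming the standard patience-sorting algorithm for the longest increasing subsequence; the paper's null distribution and critical values are not reproduced, and the function names are illustrative.

```python
# Hedged sketch: length of the longest increasing subsequence (LIS) of the
# permutation sending the ranks of X to the ranks of Y.
import bisect
import numpy as np

def lis_length(perm):
    """LIS length via patience sorting, O(n log n)."""
    tails = []  # tails[k] = smallest tail of an increasing run of length k+1
    for v in perm:
        i = bisect.bisect_left(tails, v)
        if i == len(tails):
            tails.append(v)
        else:
            tails[i] = v
    return len(tails)

def lis_statistic(x, y):
    """LIS of the ranks of Y after ordering the sample by X."""
    order = np.argsort(np.asarray(x))
    y_ranks = np.argsort(np.argsort(np.asarray(y)))  # ranks 0..n-1
    return lis_length(y_ranks[order])
```

Under strong positive dependence the statistic is close to the sample size n, whereas under independence the rank permutation is uniform and the LIS is of order 2*sqrt(n).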


A Bayesian Nonparametric Test for Minimal Repair

Li Li, Timothy Hanson, Paul Damien & Elmira Popova. (a) Department of Statistics, University of South Carolina, Columbia, SC 29201; (b) McCombs School of Business, University of Texas at Austin, Austin, TX 78712; (c) Department of Mechanical Engineering, University of Texas at Austin, Austin, TX 78712. Accepted author version posted online: 20 Sep ...



Journal

Journal title: Canadian Journal of Statistics

Year: 2021

ISSN: 0319-5724, 1708-945X

DOI: https://doi.org/10.1002/cjs.11645